# Pre-trained causal model

## Tibetan Roberta Causal Base

License: MIT
A pre-trained causal language model for Tibetan, based on the RoBERTa architecture and designed primarily for Tibetan text generation.
Tags: Large Language Model, Transformers, Other

Author: sangjeedondrub
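A minimal sketch of how such a causal model could be loaded for generation with the Hugging Face `transformers` library. The model id `sangjeedondrub/tibetan-roberta-causal-base` is an assumption inferred from the page's title and author name — verify it on the Hugging Face Hub before use.

```python
# Sketch: text generation with a causal LM via Hugging Face transformers.
# MODEL_ID is assumed from the page's author and model title, not confirmed.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "sangjeedondrub/tibetan-roberta-causal-base"  # assumed model id

def generate(prompt: str, max_new_tokens: int = 50) -> str:
    """Generate a Tibetan continuation of `prompt`.

    Downloads the tokenizer and model weights on first call.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Calling `generate("ཁྱེད་རང་")` would return the prompt plus up to 50 generated tokens; sampling parameters (e.g. `do_sample=True`, `top_p`) can be passed to `model.generate` to vary the output.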